Rényi divergence and the central limit theorem

Authors

S. G. Bobkov, G. P. Chistyakov, and F. Götze

Similar articles

Kullback-Leibler Divergence and the Central Limit Theorem

This paper investigates the asymptotics of the Kullback-Leibler divergence between two probability distributions satisfying a central limit theorem property. The basic problem is as follows. Let X_i, i ∈ ℕ, be a sequence of independent random variables such that the sum S_n = ∑_{i=1}^{n} X_i has the same expected value and satisfies the CLT under each probability distribution. Then what are the asymptotics...
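The full text is paywalled on this page, but the setup invites a quick numerical illustration. The sketch below is my own, not the paper's method: it estimates a closely related quantity, the Kullback-Leibler divergence of the standardized sum S_n/√n from its Gaussian limit, using a crude histogram estimator (kl_to_std_normal is a hypothetical helper, and the exponential step distribution is an arbitrary choice).

```python
# Illustration only: watch D(law of S_n/sqrt(n) || N(0,1)) shrink as the
# CLT pulls the standardized sum toward the standard normal.
import numpy as np
from math import sqrt, pi

rng = np.random.default_rng(0)

def kl_to_std_normal(samples, bins=200):
    # Crude histogram estimate of D(P || N(0,1)) from samples of P.
    p, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    widths = np.diff(edges)
    phi = np.exp(-centers**2 / 2) / sqrt(2 * pi)  # standard normal density
    mask = p > 0
    return float(np.sum(widths[mask] * p[mask] * np.log(p[mask] / phi[mask])))

N = 100_000
for n in (1, 4, 16, 64):
    # Steps: standardized exponential (mean 0, variance 1) under a single law.
    x = rng.exponential(1.0, size=(N, n)) - 1.0
    s = x.sum(axis=1) / sqrt(n)
    print(f"n={n:3d}  KL(S_n/sqrt(n) || N(0,1)) ≈ {kl_to_std_normal(s):.5f}")
```

The printed divergences decrease toward zero as n grows, which is the qualitative phenomenon whose precise rates results of this kind quantify.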


Central Limit Theorem in Multitype Branching Random Walk

A discrete-time multitype (p-type) branching random walk on the real line ℝ is considered. The positions of the type-j individuals in the n-th generation form a point process. The asymptotic behavior of these point processes as the generation size tends to infinity is studied, and a central limit theorem is proved.
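As a toy illustration of this setting (my simplification, not the paper's construction: a single type instead of p types, deterministic binary offspring, standard normal displacements), the sketch below simulates a branching random walk and checks that the empirical distribution of the n-th generation positions, scaled by √n, looks standard normal:

```python
# Single-type branching random walk: every individual has two children,
# each displaced from the parent by an independent N(0, 1) step. Along
# any ancestral line, the generation-n position is a sum of n i.i.d.
# steps, and the CLT shows up in the empirical law of the positions.
import numpy as np

rng = np.random.default_rng(1)

def branching_random_walk(generations, offspring=2):
    positions = np.zeros(1)  # generation 0: one individual at the origin
    for _ in range(generations):
        parents = np.repeat(positions, offspring)
        positions = parents + rng.standard_normal(parents.size)
    return positions

n = 15  # 2**15 = 32768 individuals in generation n
z = branching_random_walk(n) / np.sqrt(n)
print("mean ≈", z.mean(), " variance ≈", z.var())  # near 0 and 1
# Tail mass should be near 1 - Phi(1) ≈ 0.1587 for a standard normal.
print("fraction of scaled positions above 1:", (z > 1).mean())
```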


The Martingale Central Limit Theorem

One of the most useful generalizations of the central limit theorem is the martingale central limit theorem of Paul Lévy. Lévy was in part inspired by Lindeberg's treatment of the central limit theorem for sums of independent, but not necessarily identically distributed, random variables. Lindeberg formulated what is, in retrospect, the right hypothesis, now known as the Lindeberg condition...
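For reference, the Lindeberg condition mentioned above has the following standard textbook statement (not quoted from the excerpt): for independent, mean-zero random variables X_1, X_2, ... with finite variances σ_i² and s_n² = ∑_{i=1}^n σ_i²,

```latex
% The Lindeberg condition: for every \varepsilon > 0,
\[
  \lim_{n \to \infty} \frac{1}{s_n^2}
  \sum_{i=1}^{n} \mathbb{E}\!\left[ X_i^2 \,
    \mathbf{1}\{ |X_i| > \varepsilon s_n \} \right] = 0 .
\]
% Under this condition S_n / s_n converges in distribution to N(0, 1);
% the martingale version replaces s_n^2 by a conditional-variance analogue.
```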


The Lindeberg central limit theorem

Theorem 1. If μ ∈ P(ℝ) has finite kth moment, k ≥ 0, then, writing φ = μ̃ for the characteristic function of μ: 1. φ ∈ C^k(ℝ). 2. φ^(k)(v) = i^k ∫_ℝ x^k e^{ivx} dμ(x). 3. φ^(k) is uniformly continuous. 4. |φ^(k)(v)| ≤ ∫_ℝ |x|^k dμ(x). [1] Charalambos D. Aliprantis and Kim C. Border, Infinite Dimensional Analysis: A Hitchhiker's Guide, third ed., p. 515, Theorem 15.15; http://individual.utoronto.ca/jordanbell/notes/narrow.pdf [2] Onno van Gaans, Probability measu...
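As a quick numerical sanity check of items 2 and 4 (my own illustration, not part of the excerpted notes), one can estimate φ^(k)(v) = i^k ∫ x^k e^{ivx} dμ(x) by Monte Carlo for μ = N(0, 1) and confirm that |φ^(k)(v)| never exceeds ∫ |x|^k dμ(x):

```python
# Monte Carlo check of |phi^(k)(v)| <= integral |x|^k dmu(x) for mu = N(0,1),
# using the representation phi^(k)(v) = i^k * E[X^k * exp(i v X)].
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)  # samples from mu = N(0, 1)
k = 2

bound = np.mean(np.abs(x) ** k)  # estimate of E|X|^k (equals 1 for k = 2)
for v in (0.0, 0.5, 1.0, 2.0):
    deriv = (1j**k) * np.mean(x**k * np.exp(1j * v * x))
    print(f"v={v}: |phi^({k})(v)| ≈ {abs(deriv):.4f}  vs  E|X|^{k} ≈ {bound:.4f}")
```

(For μ = N(0, 1) and k = 2 the estimates can also be checked against the closed form φ''(v) = (v² − 1)e^{−v²/2}.)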


Rényi divergence and majorization

Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
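For concreteness (a standard definition, with conventions chosen here rather than taken from the excerpt): for discrete distributions P and Q and order α ∈ (0, 1) ∪ (1, ∞), D_α(P‖Q) = (1/(α − 1)) log ∑_x P(x)^α Q(x)^{1−α}, and the limit α → 1 recovers the Kullback-Leibler divergence. A minimal sketch:

```python
# Rényi divergence of order alpha between discrete distributions, with the
# alpha -> 1 limit handled as the Kullback-Leibler divergence.
import numpy as np

def renyi_divergence(p, q, alpha):
    p, q = np.asarray(p, float), np.asarray(q, float)
    if alpha == 1.0:  # limiting case: KL divergence
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p**alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
for a in (0.5, 0.9, 0.999, 1.0, 2.0):
    print(f"alpha={a}: D_alpha(P||Q) = {renyi_divergence(p, q, a):.6f}")
# D_alpha is nondecreasing in alpha, one of the standard properties such
# reviews cover; the alpha=0.999 value sits just below the KL value at 1.
```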



Journal

Journal title: The Annals of Probability

Year: 2019

ISSN: 0091-1798

DOI: 10.1214/18-aop1261